KMID : 1144120230130030465
Biomedical Engineering Letters 2023, Volume 13, No. 3, pp. 465~473
Facial electromyogram-based facial gesture recognition for hands-free control of an AR/VR environment: optimal gesture set selection and validation of feasibility as an assistive technology
Kim Chung-Hwan, Kim Chae-Yoon, Kim Hyun-Sub, Kwak Hwy-Kuen, Lee Woo-Jin, Im Chang-Hwan
Abstract
The rapid expansion of virtual reality (VR) and augmented reality (AR) into various applications has increased the demand for hands-free input interfaces in situations where traditional control methods are inapplicable (e.g., for paralyzed individuals who cannot move their hands). The facial electromyogram (fEMG), a bioelectric signal generated by the facial muscles, could address this problem. Because fEMG signals vary with facial gestures, they can be used to discriminate gestures and thereby generate discrete hands-free control commands. This study implemented an fEMG-based facial gesture recognition system for generating discrete commands to control an AR or VR environment. fEMG signals around the eyes were recorded, assuming that the fEMG electrodes were embedded in a VR head-mounted display (HMD). Sixteen discrete facial gestures were classified using linear discriminant analysis (LDA) with Riemannian geometry features. Because the electrodes were located far from the facial muscles associated with some gestures, certain similar gestures were indistinguishable from one another. Therefore, this study determined the facial gesture combinations with the highest classification accuracy for 3~15 commands. An analysis of the fEMG data acquired from 15 participants showed that the optimal facial gesture combinations increased accuracy by 4.7 percentage points compared with randomly selected combinations. Moreover, this study is the first to investigate the feasibility of a subject-independent facial gesture recognition system that requires no individual user training sessions. Lastly, the online hands-free control system was successfully applied to a media player to demonstrate the applicability of the proposed system.
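The gesture-subset selection described in the abstract can be illustrated with a minimal sketch. Assuming a confusion matrix has been estimated from training data (the toy 6-gesture matrix below is hypothetical, standing in for the 16-gesture case; the authors' exact search procedure is not specified in the abstract), an exhaustive search over all k-gesture combinations picks the subset whose restricted classification accuracy is highest:

```python
from itertools import combinations
import numpy as np

# Hypothetical confusion matrix for 6 gestures (rows: true, cols: predicted),
# standing in for a matrix estimated from fEMG training data.
conf = np.array([
    [90,  5,  3,  1,  1,  0],
    [ 6, 80, 10,  2,  1,  1],
    [ 4, 12, 78,  3,  2,  1],
    [ 1,  2,  2, 92,  2,  1],
    [ 1,  1,  3,  2, 88,  5],
    [ 0,  1,  1,  2,  6, 90],
])

def subset_accuracy(conf, subset):
    """Accuracy when only the gestures in `subset` are used as commands:
    restrict the confusion matrix to those rows/columns and renormalize."""
    sub = conf[np.ix_(subset, subset)]
    return np.trace(sub) / sub.sum()

def best_subset(conf, k):
    """Exhaustively search all k-gesture combinations for the most separable one
    (feasible for 16 gestures and all command-set sizes from 3 to 15)."""
    return max(combinations(range(len(conf)), k),
               key=lambda s: subset_accuracy(conf, s))

best = best_subset(conf, 3)
print(best, round(subset_accuracy(conf, best), 3))  # → (0, 3, 5) 0.982
```

Here the search drops the two mutually confusable gestures (1 and 2) and keeps three well-separated ones, mirroring how an optimal command set can outperform a randomly chosen one.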
KEYWORDS
Facial electromyogram, Facial expression, Virtual reality, Augmented reality, Assistive device